Quantities of information

The mathematical theory of information is based on probability theory and statistics, and it measures information using several quantities of information. The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, based on the natural logarithm, and the hartley, based on the common (base-10) logarithm.
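As a quick numeric illustration of how the base fixes the unit (a sketch of our own, not taken from the article text), the self-information of a fair coin flip in each of the three units:
<syntaxhighlight lang="python">
import math

# Self-information -log(p) of a fair coin flip (probability 1/2),
# measured in three common units.
p = 0.5
print(-math.log2(p))     # 1.0 bit (binary logarithm)
print(-math.log(p))      # ~0.693 nats (natural logarithm)
print(-math.log10(p))    # ~0.301 hartleys (common, base-10 logarithm)
</syntaxhighlight>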
In what follows, an expression of the form p \log p is considered by convention to be equal to zero whenever ''p'' is zero. This is justified because \lim_{p \to 0^+} p \log p = 0 for any logarithmic base.
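The limit itself follows from l'Hôpital's rule (a standard calculus step, spelled out here for completeness). Writing the base as ''b'':
: \lim_{p \to 0^+} p \log_b p = \lim_{p \to 0^+} \frac{\log_b p}{1/p} = \lim_{p \to 0^+} \frac{1/(p \ln b)}{-1/p^2} = \lim_{p \to 0^+} \frac{-p}{\ln b} = 0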
==Self-information==
Shannon derived a measure of information content called the self-information or "surprisal" of a message ''m'':
: I(m) = \log\left(\frac{1}{p(m)}\right) = -\log(p(m))
where p(m) = \mathrm{Pr}(M=m) is the probability that message ''m'' is chosen from all possible choices in the message space M. The base of the logarithm only affects a scaling factor and, consequently, the units in which the measured information content is expressed. If the logarithm is base 2, the measure of information is expressed in units of bits.
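The definition translates directly into code. The following is a minimal Python sketch (the function name and interface are ours, not part of the article):
<syntaxhighlight lang="python">
import math

def self_information(p, base=2):
    """Self-information I(m) = log(1/p(m)) of a message with probability p.

    base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
    """
    if not 0 < p <= 1:
        raise ValueError("p must be a probability in (0, 1]")
    return math.log(1 / p, base)

print(self_information(1 / 8))        # ~3.0 bits: a message of probability 1/8
print(self_information(1.0))          # 0.0 bits: a certain message
print(self_information(0.5, math.e))  # ~0.693 nats (= ln 2)
</syntaxhighlight>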
Information is transferred from a source to a recipient only if the recipient did not already have that information to begin with. A message that conveys something certain to happen, and already known to the recipient, contains no real information; infrequent messages carry more information than frequent ones. This is reflected in the equation above: a certain message, i.e. one of probability 1, has an information measure of zero. In addition, a compound message of two (or more) unrelated (mutually independent) messages has a quantity of information equal to the sum of the measures of information of each message individually, since the probability of the compound message is the product of the individual probabilities and the logarithm turns that product into a sum. This, too, is reflected in the equation above, supporting the validity of its derivation.
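Continuing the sketch above, the additivity property can be checked numerically: for independent messages the joint probability is the product of the individual probabilities, so the surprisals add.
<syntaxhighlight lang="python">
p1, p2 = 1 / 4, 1 / 8     # probabilities of two independent messages
joint = p1 * p2           # independence: joint probability is the product
print(self_information(p1) + self_information(p2))  # ~5.0 bits
print(self_information(joint))                      # ~5.0 bits, the same total
</syntaxhighlight>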
An example: a weather broadcast that reads "Tonight's forecast: Dark. Continued darkness until widely scattered light in the morning" contains almost no information. A forecast of a snowstorm, however, certainly contains information, since snowstorms do not occur every evening. An accurate forecast of snow for a warm location such as Miami contains even more information, and the amount of information in a forecast of snow for a location where it never snows (an impossible event) is the highest of all (infinite).

Source: Wikipedia, the free encyclopedia (English edition). The full article "Quantities of information" is available on Wikipedia.


